
    An algorithm for the systematic disturbance of optimal rotational solutions

    An algorithm for introducing a systematic rotational disturbance into an optimal (i.e., single axis) rotational trajectory is described. This disturbance introduces a motion vector orthogonal to the quaternion-defined optimal rotation axis. By altering the magnitude of this vector, the degree of non-optimality can be controlled. The metric properties of the distortion parameter are described, with analogies to two-dimensional translational motion. This algorithm was implemented in a motion-control program on a three-dimensional graphic workstation. It supports a series of human performance studies on the detectability of rotational trajectory optimality by naive observers.
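    One way such a disturbance might be realized (a hypothetical sketch, not the paper's implementation; all names and the parameter epsilon are ours): tilt the optimal rotation axis by adding an orthogonal unit vector scaled by a distortion parameter, then renormalize. Larger values of the parameter give a larger departure from single-axis optimality.

```python
import math

def perturb_rotation_axis(axis, epsilon):
    """Tilt a unit rotation axis by adding an orthogonal vector of
    magnitude epsilon, then renormalizing.  Hypothetical sketch of the
    kind of disturbance the abstract describes; not the paper's code."""
    ax, ay, az = axis
    # Pick a reference vector not parallel to the axis, then project out
    # the axis component to obtain a direction orthogonal to the axis.
    ref = (1.0, 0.0, 0.0) if abs(ax) < 0.9 else (0.0, 1.0, 0.0)
    dot = ref[0] * ax + ref[1] * ay + ref[2] * az
    ox, oy, oz = ref[0] - dot * ax, ref[1] - dot * ay, ref[2] - dot * az
    norm = math.sqrt(ox * ox + oy * oy + oz * oz)
    ox, oy, oz = ox / norm, oy / norm, oz / norm
    # Add the scaled orthogonal disturbance and renormalize to a unit axis.
    px, py, pz = ax + epsilon * ox, ay + epsilon * oy, az + epsilon * oz
    n = math.sqrt(px * px + py * py + pz * pz)
    return (px / n, py / n, pz / n)
```

    With epsilon = 0 the original axis is returned unchanged; for epsilon > 0 the perturbed axis deviates from the original by atan(epsilon), so the single scalar parameter controls the degree of non-optimality monotonically.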

    Time-to-Passage Judgments in Nonconstant Optical Flow Fields

    The time until an approaching object will pass an observer (time to passage, or TTP) is optically specified by a global flow field even in the absence of local expansion or size cues. Kaiser and Mowafy have demonstrated that observers are in fact sensitive to this global flow information. The present studies investigate two factors that are usually ignored in work related to TTP: (1) non-constant motion functions and (2) concomitant eye rotation. Non-constant velocities violate an assumption of some TTP derivations, and eye rotations may complicate heading extraction. Such factors have practical significance, for example, in the case of a pilot accelerating an aircraft or executing a roll. In our studies, a flow field of constant-sized stars was presented monocularly on a large screen. TTP judgments had to be made on the basis of one target star. The flow field varied in its acceleration pattern and its roll component. Observers did not appear to utilize acceleration information. In particular, TTP judgments for decelerating motion were consistently underestimated. TTP judgments were fairly robust with respect to roll, even when the roll axis and track vector were decoupled. However, substantial decoupling between heading and track vector led to a decrement in performance, in both the presence and the absence of roll.
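    The global-flow specification of TTP can be illustrated with the classic constant-velocity relation, in which TTP is recovered from a target's optical angle and angular rate alone. A sketch in our own notation (not code from the study); note that this relation assumes constant velocity, which is exactly the assumption the non-constant motion conditions violate.

```python
import math

def ttp_from_flow(theta, theta_dot):
    """Constant-velocity time to passage recovered from a target's
    optical angle theta (radians from the heading) and its angular rate
    theta_dot, via TTP = sin(2*theta) / (2*theta_dot).
    Sketch in our own notation; not code from the study."""
    return math.sin(2.0 * theta) / (2.0 * theta_dot)

# Geometric check: observer moving at speed v along the track,
# target at along-track distance z and lateral offset r (made-up values).
r, z, v = 30.0, 120.0, 40.0          # metres, metres, m/s
theta = math.atan2(r, z)             # optical angle from the heading
theta_dot = r * v / (r * r + z * z)  # angular rate for this geometry
# ttp_from_flow(theta, theta_dot) recovers the true TTP, z / v = 3.0 s
```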

    Cueing light configuration for aircraft navigation

    A pattern of light is projected from multiple sources located on an aircraft to form two clusters. The pattern of each cluster changes as the aircraft flies above and below a predetermined nominal altitude. The initial patterns are two horizontal, spaced-apart lines. Each is capable of changing to a delta formation as either the altitude or the terrain varies. The direction of the delta cues the pilot as to the direction of corrective action.

    Visual information for judging temporal range

    Work in our laboratory suggests that pilots can extract temporal range information (i.e., the time to pass a given waypoint) directly from out-the-window motion information. This extraction does not require the use of velocity or distance, but rather operates solely on a 2-D motion cue. In this paper, we present the mathematical derivation of this information, psychophysical evidence of human observers' sensitivity, and possible advantages and limitations of basing vehicle control on this parameter.
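    The paper's derivation is not reproduced in the abstract; under standard assumptions (constant observer speed v along the heading, waypoint at along-track distance z and lateral offset r, optical angle theta measured from the heading), a derivation of this kind runs as follows, in our notation rather than necessarily the paper's:

```latex
\tan\theta = \frac{r}{z}, \qquad \dot z = -v
\quad\Rightarrow\quad
\dot\theta = \frac{r v}{r^2 + z^2},

\mathrm{TTP} = \frac{z}{v}
  = \frac{\sin\theta\cos\theta}{\dot\theta}
  = \frac{\sin 2\theta}{2\,\dot\theta}.
```

    The final expression involves only the optical angle and its rate of change, consistent with the abstract's claim that temporal range is available from a 2-D motion cue without knowledge of velocity or distance.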

    Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
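    As an illustration of one such relation (a back-of-envelope calculation of our own, not taken from the paper): 20/20 Snellen acuity corresponds to resolving detail of about 1 arcminute, i.e. roughly 30 cycles per degree, and Nyquist sampling requires two pixels per cycle, so a display needs on the order of 60 pixels per degree before its resolution stops limiting a normally sighted pilot.

```python
def required_pixels_per_degree(snellen_denominator):
    """Display resolution needed so the display does not limit an
    observer with the given Snellen acuity (20/snellen_denominator).
    20/20 vision resolves ~1 arcmin of detail, i.e. ~30 cycles/deg;
    Nyquist sampling needs 2 pixels per cycle, hence ~60 px/deg.
    Illustrative back-of-envelope calculation, not from the paper."""
    cycles_per_degree = 30.0 * (20.0 / snellen_denominator)
    return 2.0 * cycles_per_degree
```

    By the same logic, a display offering only 30 pixels per degree effectively caps the supported acuity near 20/40.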

    Perspective Imagery in Synthetic Scenes used to Control and Guide Aircraft during Landing and Taxi: Some Issues and Concerns

    Perspective synthetic displays that supplement, or supplant, the optical windows traditionally used for guidance and control of aircraft are accompanied by potentially significant human factors problems related to the optical geometric conformality of the display. Such geometric conformality is broken when optical features are not in the location they would be if directly viewed through a window. This often occurs when the scene is relayed or generated from a location different from the pilot's eyepoint. However, assuming no large visual/vestibular effects, a pilot can often learn to use such a display very effectively. Important problems may arise, however, when display accuracy or consistency is compromised, and this can usually be related to geometrical discrepancies between how the synthetic visual scene behaves and how the visual scene through a window behaves. In addition to these issues, this paper examines the potentially critical problem of the disorientation that can arise when both a synthetic display and a real window are present in a flight deck, and no consistent visual interpretation is available.

    Choosing Your Poison: Optimizing Simulator Visual System Selection as a Function of Operational Tasks

    Although current-technology simulator visual systems can achieve extremely realistic levels, they do not completely replicate the experience of a pilot sitting in the cockpit, looking at the outside world. Some differences in experience are due to visual artifacts, or perceptual features that would not be present in a naturally viewed scene. Others are due to features that are missing from the simulated scene. In this paper, these differences will be defined and discussed. The significance of these differences will be examined as a function of several particular operational tasks. A framework to facilitate the choice of visual system characteristics based on operational task requirements will be proposed.

    Depth Perception, Cueing, and Control

    Humans rely on a variety of visual cues to inform them of the depth or range of a particular object or feature. Some cues are provided by physiological mechanisms, others from pictorial cues that are interpreted psychologically, and still others by the relative motions of objects or features induced by observer (or vehicle) motions. These cues provide different levels of information (ordinal, relative, absolute) and saliency depending upon depth, task, and interaction with other cues. Display technologies used for head-down and head-up displays, as well as out-the-window displays, have differing capabilities for providing depth cueing information to the observer/operator. In addition to technologies, display content and the source (camera/sensor versus computer rendering) provide varying degrees of cue information. Additionally, most displays create some degree of cue conflict. In this paper, visual depth cues and their interactions will be discussed, as well as display technology and content and related artifacts. Lastly, the role of depth cueing in performing closed-loop control tasks will be discussed.
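    The range dependence of these cues can be made concrete with one physiological example: under the small-angle approximation, the binocular disparity produced by a depth difference delta at viewing distance d is roughly I * delta / d^2, where I is the interpupillary distance, so the cue falls off with the square of distance. A minimal sketch of our own (the 0.065 m IPD is a typical adult value, not a figure from the paper):

```python
import math

def disparity_arcmin(distance_m, delta_m, ipd_m=0.065):
    """Approximate binocular disparity (arcminutes) created by a depth
    difference delta_m at viewing distance distance_m, using the
    small-angle relation disparity ~ ipd * delta / distance**2.
    Illustrative only; 0.065 m is a typical adult interpupillary distance."""
    disparity_rad = ipd_m * delta_m / distance_m ** 2
    return math.degrees(disparity_rad) * 60.0
```

    The 1/d^2 falloff is one reason stereopsis is a strong cue at near range but contributes little beyond tens of metres, where motion and pictorial cues dominate.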

    Proceedings of the Augmented Visual Display (AVID) Research Workshop

    This volume collects the papers, abstracts, and presentations from a three-day workshop focused on sensor modeling and simulation, and on image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.

    Visually Guided Control of Movement

    The papers given at an intensive, three-week workshop on visually guided control of movement are presented. The participants were researchers from academia, industry, and government, with backgrounds in visual perception, control theory, and rotorcraft operations. The papers included invited lectures and preliminary reports of research initiated during the workshop. Three major topics are addressed: extraction of environmental structure from motion; perception and control of self motion; and spatial orientation. Each topic is considered from both theoretical and applied perspectives. Implications for control and display are suggested.